Probability-driven motion planning for mobile robots
This paper proposes a path-planning method for mobile robots in the presence of uncertainty. We analyze environment and control uncertainty and propose methods for incorporating each of them into the planning algorithm. We model the environment using a pyramid structure that encodes occupancy probabilities for each pixel as well as partial information on the conditional probabilities among different pixels. This structure allows for efficient and accurate computation of collision probabilities in the presence of environment uncertainty. The control uncertainty is characterized mainly by its expansion in space and time and is accordingly modeled by a stochastic differential equation that mathematically captures this phenomenon. The models we develop are inevitably approximate, but experiments confirm that they are a reasonable basis for motion planning. We have conducted a series of experiments on a mobile platform, and some of the results are presented.
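The coarse-to-fine idea behind such a structure can be sketched as a max-pooling occupancy pyramid over a probability grid, with collision probability for a swept region evaluated under a cell-independence assumption. The function names and the independence simplification are illustrative assumptions; the paper's structure additionally encodes partial conditional probabilities among pixels, which this sketch omits.

```python
def build_pyramid(grid):
    # grid: square 2-D list of per-cell occupancy probabilities
    # (side length a power of two). Each coarser level stores the max
    # of its four children, giving a conservative occupancy bound that
    # lets a planner prune clearly-free regions without descending.
    levels = [grid]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        half = len(prev) // 2
        levels.append([[max(prev[2 * i][2 * j], prev[2 * i][2 * j + 1],
                            prev[2 * i + 1][2 * j], prev[2 * i + 1][2 * j + 1])
                        for j in range(half)] for i in range(half)])
    return levels  # levels[0] is finest, levels[-1] is a single cell


def collision_probability(levels, cells):
    # Probability that at least one swept cell is occupied, assuming
    # independent cells (a simplification -- see the lead-in above).
    p_free = 1.0
    for (i, j) in cells:
        p_free *= 1.0 - levels[0][i][j]
    return 1.0 - p_free
```

A path is then checked by listing the finest-level cells it sweeps; the coarse levels serve only to bound and prune that computation early.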
Real-time visual servoing
A real-time tracking algorithm, used in conjunction with a predictive filter, allows real-time visual servoing of a robotic arm tracking a moving object. The system consists of two calibrated (but unregistered) cameras that provide images to a real-time, pipeline-parallel optic-flow algorithm that robustly computes optic flow and calculates the 3-D position of a moving object at approximately 5 Hz. These 3-D positions serve as input to a predictive kinematic control algorithm that uses an α-β-γ filter to update the position of a robotic arm tracking the moving object. Experimental results are presented for the tracking of a moving model train along a variety of different trajectories.
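An α-β-γ (g-h-k) filter of the kind named above can be sketched in a few lines for a single coordinate. The gains below are critically damped values derived from a smoothing factor θ = 0.6 (an illustrative choice, not the paper's tuned parameters), and the class name is hypothetical.

```python
class AlphaBetaGammaFilter:
    """Minimal α-β-γ tracker for one coordinate of a moving target."""

    def __init__(self, dt=0.2, theta=0.6):
        # Critically damped gains from smoothing factor theta
        # (fading-memory g-h-k formulas; theta = 0.6 is illustrative).
        self.dt = dt
        self.alpha = 1.0 - theta ** 3
        self.beta = 1.5 * (1.0 - theta ** 2) * (1.0 - theta)
        self.gamma = 0.5 * (1.0 - theta) ** 3
        self.x = self.v = self.a = 0.0  # position, velocity, acceleration

    def update(self, z):
        dt = self.dt
        # Predict with constant-acceleration kinematics, then correct
        # each state by a fraction of the measurement residual r.
        x_pred = self.x + self.v * dt + 0.5 * self.a * dt * dt
        v_pred = self.v + self.a * dt
        r = z - x_pred
        self.x = x_pred + self.alpha * r
        self.v = v_pred + (self.beta / dt) * r
        self.a += (2.0 * self.gamma / (dt * dt)) * r
        return self.x

    def predict(self, horizon):
        # Extrapolate ahead of the newest estimate, e.g. so the arm
        # can lead the target rather than lag behind it.
        return self.x + self.v * horizon + 0.5 * self.a * horizon ** 2
```

Three such filters, one per Cartesian axis, suffice to smooth and extrapolate the 3-D positions the vision front end delivers at roughly 5 Hz.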
Modeling dynamic uncertainty in robot motions
A method for modeling the uncertainties that exist in a robotic system, based on stochastic differential equations, is presented. Such a model provides an analytical structure that captures both the uncertainty within the motion descriptions and the dynamic, changing nature of the task and its constraints. Reflecting the dynamic nature of robotic motion tasks, the proposed model of environment uncertainty is dynamic rather than static: the amount of knowledge about the environment is allowed to change as the robot moves. These results suggest that computational models traditionally found at the lower levels of robot systems may have application at the upper planning levels as well. Some experimental results using the model are presented.
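A minimal illustration of an SDE-based uncertainty model: Euler-Maruyama rollouts of dX = v dt + σ dW, whose positional variance grows linearly with travel time, echoing the expansion of uncertainty in space and time described above. The drift, diffusion, and step values are arbitrary assumptions, not the paper's.

```python
import random


def simulate_rollouts(v=0.5, sigma=0.1, dt=0.05, steps=100, n=2000, seed=7):
    # Euler-Maruyama rollouts of dX = v dt + sigma dW: hypothetical 1-D
    # motion with additive Brownian control noise. Analytically,
    # E[X_T] = v * T and Var[X_T] = sigma**2 * T, so the position
    # spread widens the longer the robot moves.
    rng = random.Random(seed)
    finals = []
    for _ in range(n):
        x = 0.0
        for _ in range(steps):
            x += v * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
        finals.append(x)
    mean = sum(finals) / n
    var = sum((f - mean) ** 2 for f in finals) / n
    return mean, var
```

With the defaults (T = 5.0 s), the sample mean lands near v·T = 2.5 and the sample variance near σ²·T = 0.05, which is the growth law a planner would feed into its collision-probability computation.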
Hand-eye coordination for grasping moving objects
Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination among at least three separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, this must be done in a dynamic manner to coordinate the motion of the robotic arm as it tracks the object. The dynamic vision system feeds a real-time arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. We present three different strategies for intercepting the object, along with results from the tracking algorithm.
Automated tracking and grasping of a moving object with a robotic hand-eye system
An attempt to achieve a high level of interaction between a real-time vision system capable of tracking moving objects in 3-D and a robot arm with gripper that can be used to pick up a moving object is described. The interplay of hand-eye coordination in dynamic grasping tasks, such as grasping parts on a moving conveyor, assembling articulated parts, or grasping from a mobile robotic system, is explored. The goal is to build an integrated sensing and actuation system that can operate in dynamic as opposed to static environments. The system addresses three distinct problems in using robotic hand-eye coordination for grasping moving objects: fast computation of 3-D motion parameters from vision, predictive control of a moving robotic arm to track a moving object, and interception and grasping. The system operates at approximately human arm movement rates. Experimental results in which a moving model train is tracked, stably grasped, and picked up by the system are presented. The algorithms developed to relate sensing to actuation are quite general and applicable to a variety of complex robotic tasks.
Planning velocity profiles from task-level constraints and environment uncertainties
A method for parameterizing robot trajectories in the presence of uncertainties is presented. The planning process is posed as a problem of constrained optimization, with the concept of a task's difficulty used as the optimization criterion. The task difficulty, as defined by the authors, combines the effects of velocity and uncertainty, mimicking human perception of difficulty in positioning tasks. The success probability is used as a constraint, which is necessary for planning tasks with contradictory requirements. This planning paradigm is demonstrated with an experiment that contains opposing requirements: reaching an obstacle in a given time without exceeding a certain maximal impact force. The planner is implemented on a real system.
Trajectory filtering and prediction for automated tracking and grasping of a moving object
The authors explore the requirements for grasping a moving object. This task requires proper coordination among at least three separate subsystems: real-time vision sensing, trajectory planning/arm control, and grasp planning. As with humans, the system first visually tracks the object's 3-D position. Because the object is in motion, this must be done in real time to coordinate the motion of the robotic arm as it tracks the object. The vision system feeds an arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. Experimental results are presented in which a moving model train was tracked, stably grasped, and picked up by the system.
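The two-step arm control described above can be sketched for a planar two-link arm: a filtering/prediction step supplies a future Cartesian target, and a closed-form kinematic transformation converts it to joint angles. The link lengths and function names here are illustrative assumptions, not the actual arm's kinematics.

```python
import math


def two_link_ik(x, y, l1=0.4, l2=0.3):
    # Step 2, kinematic transformation: closed-form inverse kinematics
    # for a planar two-link arm (a stand-in for the real manipulator;
    # link lengths l1, l2 are arbitrary assumptions).
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        return None  # predicted point is out of reach
    s2 = math.sqrt(1.0 - c2 * c2)  # elbow-down branch
    th2 = math.atan2(s2, c2)
    th1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
    return th1, th2


def track_step(filter_predict, horizon=0.1):
    # Step 1, filtering/prediction: filter_predict(horizon) returns the
    # predicted (x, y) of the object `horizon` seconds ahead; the joint
    # targets for that point are then computed in closed form.
    x, y = filter_predict(horizon)
    return two_link_ik(x, y)
```

Running this pair once per vision update gives the arm a fresh joint-space target that leads the object rather than chasing its last observed position.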
Real-Time Visual Servoing
This paper describes a new real-time tracking algorithm in conjunction with a predictive filter to allow real-time visual servoing of a robotic arm that is following a moving object. The system consists of two calibrated (but unregistered) cameras that provide images to a real-time, pipelined-parallel optic-flow algorithm that can robustly compute optic flow and calculate the 3-D position of a moving object at approximately 5 Hz rates. These 3-D positions of the moving object serve as input to a predictive kinematic control algorithm that uses an α-β-γ filter to update the position of a robotic arm tracking the moving object. Experimental results are presented for the tracking of a moving model train in a variety of different trajectories.

1. INTRODUCTION
Tracking the three-dimensional movement of objects by a vision system in real time is an important problem. It has been addressed by researchers in a number of different fields, including target tracking, surveillance, automated ..
APHRODITE: Intelligent Planning, Control and Sensing in a Distributed Robotic System
In this paper we describe APHRODITE, a general-purpose robot programming environment built at Columbia University's Center for Research in Intelligent Systems. The environment is based on a distributed multiprocessor architecture that offers great flexibility and computing power and provides reliable real-time response while using off-the-shelf software tools. The supporting software that we developed provides transparent operation across the network and distributed interrupt handling. We define the interfaces between the main hierarchical levels in the system and describe the implementation as it has been used to create a distributed, multi-tasking application. We have implemented a test task consisting of an assembly operation that includes compliant motion planning in an uncertain environment and intelligent monitoring of the assembly process.

1. Introduction
A typical robotic system is usually composed of many different functional processing modules. One of the major pr..